# Contrastive Learning Training
| Model | Author | License | Tags | Downloads | Likes | Description |
|-------|--------|---------|------|-----------|-------|-------------|
| SciRus-tiny | mlsa-iai-msu-lab | MIT | Text Embedding, Transformers, Supports Multiple Languages | 369 | 12 | Compact model for obtaining Russian and English scientific text embeddings, trained on eLibrary data using contrastive learning. |
| Lyrics Bert | brunokreiner | N/A | Text Embedding, Transformers | 568 | 2 | Sentence transformer trained on 480,000 English lyrics, producing 300-dimensional sentence embeddings. |
| All Datasets V3 Mpnet Base | flax-sentence-embeddings | Apache-2.0 | Text Embedding, English | 3,472 | 13 | Sentence embedding model based on the MPNet architecture, mapping text to a 768-dimensional vector space; suitable for semantic search and sentence similarity. |
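All three entries are sentence-embedding models, so they can typically be queried in the same way. The sketch below is an illustration rather than official usage from any of these model cards: it assumes the SciRus-tiny checkpoint is published under a Hugging Face-style id (`mlsa-iai-msu-lab/sci-rus-tiny`) and that the `sentence-transformers` library can load it; cosine similarity between the resulting embeddings then serves as a semantic-similarity score.

```python
# Minimal usage sketch (not taken from the model cards above): obtaining
# sentence embeddings with the sentence-transformers library. The model id
# below is an assumed Hugging Face-style id for SciRus-tiny; swap in the id
# of whichever model from the table you actually want to use.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("mlsa-iai-msu-lab/sci-rus-tiny")  # assumed id

sentences = [
    "Graph neural networks for molecular property prediction.",
    "Predicting molecular properties with graph-based deep learning.",
]

# Encode both sentences and compare them with cosine similarity; models
# trained with contrastive learning place paraphrases close together.
embeddings = model.encode(sentences, normalize_embeddings=True)
print(util.cos_sim(embeddings[0], embeddings[1]).item())
```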